Checking Interaction-Based Declassification Policies for Android Using Symbolic Execution
Mobile apps can access a wide variety of secure information, such as contacts
and location. However, current mobile platforms include only coarse access
control mechanisms to protect such data. In this paper, we introduce
interaction-based declassification policies, in which the user's interactions
with the app constrain the release of sensitive information. Our policies are
defined extensionally, so as to be independent of the app's implementation,
based on sequences of security-relevant events that occur in app runs. Policies
use LTL formulae to precisely specify which secret inputs, read at which times,
may be released. We formalize a semantic security condition, interaction-based
noninterference, to define our policies precisely. Finally, we describe a
prototype tool that uses symbolic execution to check interaction-based
declassification policies for Android, and we show that it enforces policies
correctly on a set of apps. Comment: This research was supported in part by NSF grants CNS-1064997 and 1421373, AFOSR grants FA9550-12-1-0334 and FA9550-14-1-0334, a partnership between UMIACS and the Laboratory for Telecommunication Sciences, and the National Security Agency.
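As a concrete illustration of the policy style described above, a hypothetical interaction-based declassification rule (the event names release, click, location, and sendButton are invented for this sketch and are not taken from the paper) could state that the app may transmit the user's location only if the user has previously tapped a send button. In LTL with past-time operators this might be written as

\[
\mathbf{G}\bigl(\mathit{release}(\mathit{location}) \rightarrow \mathbf{O}\,\mathit{click}(\mathit{sendButton})\bigr)
\]

where \(\mathbf{G}\) is the "globally" operator and \(\mathbf{O}\) the past-time "once" operator, so every release of the location must be preceded by a send-button click somewhere earlier in the run.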
Automatic mental processes, automatic actions and behaviours in game transfer phenomena: an empirical self-report study using online forum data
Previous studies have demonstrated that the playing of videogames can have both intended and unintended effects. The purpose of this study was to investigate the influence of videogames on players’ mental processes and behaviours in day-to-day settings. A total of 1,023 self-reports from 762 gamers collected from online videogame forums were classified, quantified, described and explained. The data include automatic thoughts, sensations and impulses, automatic mental replays of the game in real life, and voluntary/involuntary behaviours with videogame content. Many gamers reported that they had responded – at least sometimes – to real life stimuli as if they were still playing videogames. This included overreactions, avoidances, and involuntary movements of limbs. These experiences lasted relatively short periods of time but in a minority of players were recurrent. The gamers' experiences appeared to be enhanced by virtual embodiment, repetitive manipulation of game controls, and their gaming habits. However, similar phenomena may also occur when doing other non-gaming activities. The implications of these game transfer experiences are discussed
Evaluating the Cost-Effectiveness of Laryngeal Examination after Elective Total Thyroidectomy
Background Although routine laryngeal examination (RLE) after thyroidectomy may cost more than selective laryngeal examination (SLE), it permits earlier detection and treatment of vocal cord palsy (VCP) and so may be cost-saving in the longer term. We compared the 2-year cost-effectiveness between RLE and SLE, with SLE performed at 2 weeks (SLE-2w), 1 month (SLE-1m), or 3 months (SLE-3m) after thyroidectomy, from the institution's perspective. Methods Our case definition was a hypothetical 50-year-old woman who underwent an elective total thyroidectomy for a benign multinodular goiter. A decision-analytic model was constructed to compare the estimated cost-effectiveness between RLE, SLE-2w, SLE-1m, and SLE-3m after a 2-year period. Outcome probabilities, utilities, and costs were estimated from the literature. The threshold for cost-effectiveness was set at US$302,755, US$247,105, respectively. RLE was only cost-effective when the temporary VCP rate increased >42.7 % or when the cost of RLE equaled zero. Similarly, SLE-2w was only cost-effective relative to SLE-3m when dysphonia for temporary VCP at 3 months increased >39.13 %, dysphonia for permanent VCP at 3 months increased >50.29 %, or dysphonia without VCP at 3 months increased >42.69 %. However, none of these scenarios appeared clinically likely. Conclusions From the institution's perspective, RLE was not cost-effective against the other three SLE strategies. Regarding the optimal timing of SLE, SLE-3m appears to be a reasonable and acceptable strategy because of its relatively low overall cost. © 2014 Society of Surgical Oncology.
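The decision-analytic comparison described above can be pictured as a probability-weighted sum of costs and quality-adjusted life-years (QALYs) per strategy, with a strategy judged cost-effective when its incremental cost per QALY falls below a willingness-to-pay threshold. The sketch below illustrates that calculation for two strategies; every probability, cost, utility, and threshold value in it is an invented placeholder, not a figure from the study.

# Minimal sketch of a two-strategy decision-analytic comparison (Python).
# All probabilities, costs, and QALY values are HYPOTHETICAL placeholders.

def expected_value(branches):
    """Probability-weighted (cost, QALY) totals over a strategy's branches."""
    cost = sum(p * c for p, c, _ in branches)
    qaly = sum(p * q for p, _, q in branches)
    return cost, qaly

# Each branch: (probability, 2-year cost in US$, QALYs over 2 years)
rle = [
    (0.02, 12000.0, 1.70),  # temporary VCP, detected early
    (0.01, 25000.0, 1.50),  # permanent VCP
    (0.97,  1500.0, 1.90),  # no VCP (everyone still pays for the routine exam)
]
sle_3m = [
    (0.02, 15000.0, 1.65),  # temporary VCP, detected later
    (0.01, 27000.0, 1.48),  # permanent VCP
    (0.97,   200.0, 1.90),  # no VCP
]

threshold = 50000.0  # illustrative willingness-to-pay per QALY

c_rle, q_rle = expected_value(rle)
c_sle, q_sle = expected_value(sle_3m)
icer = (c_rle - c_sle) / (q_rle - q_sle) if q_rle != q_sle else float("inf")

print(f"RLE:    expected cost ${c_rle:,.0f}, QALYs {q_rle:.3f}")
print(f"SLE-3m: expected cost ${c_sle:,.0f}, QALYs {q_sle:.3f}")
verdict = "cost-effective" if icer <= threshold else "not cost-effective"
print(f"ICER of RLE vs SLE-3m: ${icer:,.0f}/QALY ({verdict} at the assumed threshold)")

A one-way sensitivity analysis of the kind reported in the abstract simply reruns this calculation while sweeping a single input (for example, the temporary VCP rate) across a range and records where the verdict flips.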
Finding and Resolving Security Misusability with Misusability Cases
Although widely used for both security and usability concerns, scenarios used in security design may not necessarily inform the design of usability, and vice versa. One way of using scenarios to bridge security and usability involves explicitly describing how design decisions can lead to users inadvertently exploiting vulnerabilities to carry out their production tasks. This paper describes how misusability cases, scenarios that describe how design decisions may lead to usability problems subsequently leading to system misuse, address this problem. We describe the related work upon which misusability cases are based before presenting the approach, and illustrating its application using a case study example. Finally, we describe some findings from this approach that further inform the design of usable and secure systems.
ASCOT: a text mining-based web-service for efficient search and assisted creation of clinical trials
Clinical trials are mandatory protocols describing medical research on humans and are among the most valuable sources of medical practice evidence. Searching for trials relevant to some query is laborious due to the immense number of existing protocols. Apart from search, writing new trials includes composing detailed eligibility criteria, which might be time-consuming, especially for new researchers. In this paper we present ASCOT, an efficient search application customised for clinical trials. ASCOT uses text mining and data mining methods to enrich clinical trials with metadata, which in turn serve as effective tools to narrow down the search. In addition, ASCOT integrates a component for recommending eligibility criteria based on a set of selected protocols.
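The metadata-driven narrowing that the abstract describes can be pictured as faceted filtering over fields extracted by text mining. Below is a minimal sketch of that idea; the record structure, field names, and example trials are invented for illustration and do not reflect ASCOT's actual data model or code.

from dataclasses import dataclass, field

@dataclass
class Trial:
    """A clinical-trial record enriched with text-mined metadata (illustrative)."""
    title: str
    conditions: set = field(default_factory=set)     # e.g. extracted disease terms
    interventions: set = field(default_factory=set)  # e.g. extracted treatment terms

TRIALS = [
    Trial("Trial A", {"type 2 diabetes"}, {"metformin"}),
    Trial("Trial B", {"type 2 diabetes", "obesity"}, {"lifestyle intervention"}),
    Trial("Trial C", {"hypertension"}, {"ACE inhibitor"}),
]

def narrow(trials, condition=None, intervention=None):
    """Keep only trials whose extracted metadata matches the requested facets."""
    hits = trials
    if condition is not None:
        hits = [t for t in hits if condition in t.conditions]
    if intervention is not None:
        hits = [t for t in hits if intervention in t.interventions]
    return hits

for t in narrow(TRIALS, condition="type 2 diabetes"):
    print(t.title)  # prints: Trial A, Trial B

A criteria-recommendation component could similarly be sketched as collecting the eligibility criteria that occur most often across a user-selected set of protocols and proposing them for reuse.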
An exploration of patient-reported symptoms in systemic lupus erythematosus and the relationship to health-related quality of life
Objective: The aim of this study was to explore the most distressing symptoms of systemic lupus erythematosus (SLE) and determine how these relate to health-related quality of life (HRQoL), anxiety/depression, patient demographics and disease characteristics (duration, activity, organ damage).
Methods: In a cross-sectional study, patients with SLE (n=324, age 18-84 years) gave written responses regarding which SLE-related symptoms they experienced as most difficult. Their responses were categorized. Within each category, patients reporting a specific symptom were compared with non-reporters and analyzed for patient demographics, disease duration, and results from the following questionnaires: Medical Outcomes Study Short-Form 36, Hospital Anxiety and Depression Scale, Systemic Lupus Activity Measure (SLAM), SLE disease activity index and the Systemic Lupus International Collaborating Clinics/American College of Rheumatology damage index.
Results: 23 symptom categories were identified. Fatigue (51%), Pain (50%) and Musculoskeletal distress (46%) were most frequently reported. Compared with non-reporters, only patients reporting Fatigue showed statistically significant impact on both mental and physical components of HRQoL. Patients with no present symptoms (10%) had higher HRQoL (p<0.001) and lower levels of depression (p<0.001), anxiety (p<0.01) and disease activity (SLAM) (p<0.001).
Conclusion: Fatigue, pain or musculoskeletal distress dominated the reported symptoms in approximately half of the patients. Only patients reporting Fatigue scored lower on both mental and physical aspects of HRQoL. Our results emphasize the need for further support and interventions to ease the symptom load and improve HRQoL in patients with SLE. Our findings further indicate that this need is particularly urgent for patients with symptoms of pain or fatigue
Sentinel surveillance for human enterovirus 71 in Sarawak, Malaysia: lessons from the first 7 years
BACKGROUND: A major outbreak of human enterovirus 71-associated hand, foot and mouth disease in Sarawak in 1997 marked the beginning of a series of outbreaks in the Asia Pacific region. Some of these outbreaks had unusually high numbers of fatalities and this generated much fear and anxiety in the region. METHODS: We established a sentinel surveillance programme for hand, foot and mouth disease in Sarawak, Malaysia, in March 1998, and the observations of the first 7 years are described here. Virus isolation, serotyping and genotyping were performed on throat, rectal, vesicle and other swabs. RESULTS: During this period Sarawak had two outbreaks of human enterovirus 71, in 2000 and 2003. The predominant strains circulating in the outbreaks of 1997, 2000 and 2003 were all from genogroup B, but the strains isolated during each outbreak were genetically distinct from each other. Human enterovirus 71 outbreaks occurred in a cyclical pattern every three years and Coxsackievirus A16 co-circulated with human enterovirus 71. Although vesicles were most likely to yield an isolate, this sample was not generally available from most cases and obtaining throat swabs was thus found to be the most efficient way to obtain virological information. CONCLUSION: Knowledge of the epidemiology of human enterovirus 71 transmission will allow public health personnel to predict when outbreaks might occur and to plan interventions in an effective manner in order to reduce the burden of disease
Early Outcomes of MDR-TB Treatment in a High HIV-Prevalence Setting in Southern Africa
BACKGROUND: Little is known about treatment of multidrug-resistant tuberculosis (MDR-TB) in high HIV-prevalence settings such as sub-Saharan Africa. METHODOLOGY/PRINCIPAL FINDINGS: We did a retrospective analysis of early outcomes of the first cohort of patients registered in the Lesotho national MDR-TB program between July 21, 2007 and April 21, 2008. Seventy-six patients were included for analysis. Patient follow-up ended when an outcome was recorded, or on October 21, 2008 for those still on treatment. Fifty-six patients (74%) were infected with HIV; the median CD4 cell count was 184 cells/μl (range 5-824 cells/μl). By the end of the follow-up period, study patients had been followed for a median of 252 days (range 12-451 days). Twenty-two patients (29%) had died, and 52 patients (68%) were alive and in treatment. In patients who did not die, culture conversion was documented in 52/54 patients (96%). One patient had defaulted, and one patient had transferred out. Death occurred after a median of 66 days in treatment (range 12-374 days). CONCLUSIONS/SIGNIFICANCE: In a region where clinicians and program managers are increasingly confronted by drug-resistant tuberculosis, this report provides sobering evidence of the difficulty of MDR-TB treatment in high HIV-prevalence settings. In Lesotho, an innovative community-based treatment model that involved social and nutritional support, twice-daily directly observed treatment and early empiric use of second-line TB drugs was successful in reducing mortality of MDR-TB patients. Further research is urgently needed to improve MDR-TB treatment outcomes in high HIV-prevalence settings.
Identification of histone modifications in biomedical text for supporting epigenomic research
Kolářik C, Klinger R, Hofmann-Apitius M. Identification of Histone Modifications in Biomedical Text for Supporting Epigenomic Research. BMC Bioinformatics. 2009;10(Suppl 1):S28.
Incidence and Risk Factors of Serious Adverse Events during Antituberculous Treatment in Rwanda: A Prospective Cohort Study
BACKGROUND: Tuberculosis (TB) and TB-human immunodeficiency virus infection (HIV) coinfection is a major public health concern in resource-limited settings. Although TB treatment is challenging in HIV-infected patients because of treatment interactions, immunopathological reactions, and concurrent infections, few prospective studies have addressed this in sub-Saharan Africa. In this study we aimed to determine incidence, causes of, and risk factors for serious adverse events among patients on first-line antituberculous treatment, as well as its impact on antituberculous treatment outcome. METHODS AND FINDINGS: Prospective observational cohort study of adults treated for TB at the Internal Medicine department of the Kigali University Hospital from May 2008 through August 2009. Of 263 patients enrolled, 253 were retained for analysis: median age 35 (interquartile range, IQR 28-40), 55% male, 66% HIV-positive with a median CD4 count of 104 cells/mm3 (IQR 44-248 cells/mm3). Forty percent had pulmonary TB, 43% extrapulmonary TB and 17% a mixed form. Sixty-four (26%) developed a serious adverse event; 58/167 (35%) HIV-infected vs. 6/86 (7%) HIV-uninfected individuals. Commonest events were concurrent infection (n = 32), drug-induced hepatitis (n = 24) and paradoxical reactions/TB-IRIS (n = 23). HIV-infection (adjusted Hazard Ratio, aHR 3.4, 95% Confidence Interval, CI 1.4-8.7) and extrapulmonary TB (aHR 2.0, 95% CI 1.1-3.7) were associated with an increased risk of serious adverse events. For TB/HIV co-infected patients, extrapulmonary TB (aHR 2.0, 95% CI 1.1-3.9) and CD4 count <100 cells/mm3 at TB diagnosis (aHR 1.7, 95% CI 1.0-2.9) were independent predictors. Adverse events were associated with an almost two-fold higher risk of unsuccessful treatment outcome at 6 months (HR 1.89, 95% CI 1.3-3.0). CONCLUSION: Adverse events frequently complicate the course of antituberculous treatment and worsen treatment outcome, particularly in patients with extrapulmonary TB and advanced immunodeficiency. Concurrent infection accounts for most events. Our data suggest that deterioration in a patient already receiving antituberculous treatment should prompt an aggressive search for additional infections.